Model and Feature Diversity for Bayesian Neural Networks in Mutual Learning
Bayesian Neural Networks (BNNs) offer probability distributions for model parameters, enabling uncertainty quantification in predictions. However, they often underperform compared to deterministic neural networks. Mutual learning can effectively enhance the performance of peer BNNs. In this paper, we propose a novel approach to improve BNN performance through deep mutual learning. The proposed approaches increase diversity in both network parameter distributions and feature distributions, encouraging peer networks to acquire distinct features that capture different characteristics of the input, which enhances the effectiveness of mutual learning. Experimental results demonstrate significant improvements in classification accuracy, negative log-likelihood, and expected calibration error compared to traditional mutual learning for BNNs.
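For context, the deep mutual learning objective that this work builds on can be sketched as follows. This is a generic two-peer formulation (cross-entropy on the true label plus a KL term pulling each peer toward the other's prediction), not the paper's exact loss; the Bayesian weight sampling and the proposed diversity terms are omitted:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def dml_loss(logits_a, logits_b, label):
    """Deep mutual learning loss for peer A: cross-entropy on the true
    label plus KL(p_b || p_a), which pulls A toward its peer's prediction.
    Peer B receives the symmetric loss with the roles swapped."""
    p_a = softmax(logits_a)
    p_b = softmax(logits_b)
    ce = -math.log(p_a[label])
    kl = sum(pb * math.log(pb / pa) for pb, pa in zip(p_b, p_a))
    return ce + kl
```

In standard deep mutual learning, each peer minimizes its own copy of this loss in alternating or joint steps, so the KL term transfers knowledge in both directions rather than from a fixed teacher.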
Model and Feature Diversity for Bayesian Neural Networks in Mutual Learning Supplementary Material
Direct maximization of KL divergence between feature distributions. We also test the direct maximization of the Kullback-Leibler (KL) divergence between the feature distributions of peer Bayesian neural networks (setting (d) in Table A.1). In Table A.2, the results for both ResNet20 and ResNet32 BNN models are reported; "*" denotes Bayesian neural networks initialized with the mean values of the pre-trained models, with the corresponding results shown in Table A.3. Figure A.1 compares the optimal transport distance between the parameter distributions of peer networks. From Tables A.1 and A.2, it is clear that our proposed method, which promotes diversity in the feature distributions, outperforms direct KL maximization between feature distributions.
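The ablation above relies on the KL divergence between feature distributions. When the features of each peer are modeled as diagonal Gaussians (a common assumption; the paper's actual feature model may differ), this KL term has a closed form, sketched below:

```python
import math

def kl_diag_gaussians(mu1, var1, mu2, var2):
    """Closed-form KL(N(mu1, diag(var1)) || N(mu2, diag(var2))) for
    diagonal Gaussians, summed over dimensions:
      0.5 * [ log(v2/v1) + (v1 + (m1 - m2)^2) / v2 - 1 ]  per dimension."""
    kl = 0.0
    for m1, v1, m2, v2 in zip(mu1, var1, mu2, var2):
        kl += 0.5 * (math.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)
    return kl
```

Maximizing this quantity directly (setting (d)) pushes the peer feature distributions apart without any constraint, which is one plausible reason it underperforms a structured diversity objective.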
Cuong Pham, Cuong C. Nguyen, Trung Le, Dinh Phung, Gustavo Carneiro, Thanh-Toan Do